A System for Accessible Artificial Intelligence
While artificial intelligence (AI) has become widespread, many commercial AI
systems are not yet accessible to individual researchers nor the general public
due to the deep knowledge of the systems required to use them. We believe that
AI has matured to the point where it should be an accessible technology for
everyone. We present an ongoing project whose ultimate goal is to deliver an
open source, user-friendly AI system that is specialized for machine learning
analysis of complex data in the biomedical and health care domains. We discuss
how genetic programming can aid in this endeavor, and highlight specific
examples where genetic programming has automated machine learning analyses in
previous projects.
Comment: 14 pages, 5 figures, submitted to the Genetic Programming Theory and Practice 2017 workshop
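As an illustration of how genetic programming can automate an analysis, here is a toy sketch (our own illustration, not the project's code; all function names are hypothetical). It evolves a symbolic arithmetic expression to fit a target function using the same evaluate-select-mutate loop that pipeline-evolution systems apply to whole machine-learning workflows.

```python
import random

# Toy genetic-programming sketch: evolve an arithmetic expression tree
# that fits the target f(x) = x*x + x. Illustrative only; real systems
# evolve entire machine-learning pipelines rather than formulas.

OPS = {'add': lambda a, b: a + b, 'mul': lambda a, b: a * b}
TERMINALS = ['x', 1.0, 2.0]

def random_tree(depth=2):
    # Grow a random expression: a terminal or an operator over subtrees.
    if depth == 0 or random.random() < 0.3:
        return random.choice(TERMINALS)
    op = random.choice(list(OPS))
    return (op, random_tree(depth - 1), random_tree(depth - 1))

def eval_tree(t, x):
    if t == 'x':
        return x
    if isinstance(t, tuple):
        return OPS[t[0]](eval_tree(t[1], x), eval_tree(t[2], x))
    return t  # numeric constant

def error(t, samples):
    # Sum of squared errors against the target function.
    return sum((eval_tree(t, x) - (x * x + x)) ** 2 for x in samples)

def mutate(t):
    # Replace a random subtree with a fresh random one.
    if random.random() < 0.5 or not isinstance(t, tuple):
        return random_tree(2)
    op, a, b = t
    return (op, mutate(a), b) if random.random() < 0.5 else (op, a, mutate(b))

def evolve(generations=30, pop_size=40, seed=0):
    random.seed(seed)
    samples = [i / 4 for i in range(-8, 9)]
    pop = [random_tree(3) for _ in range(pop_size)]
    best = min(pop, key=lambda t: error(t, samples))
    for _ in range(generations):
        scored = sorted(pop, key=lambda t: error(t, samples))
        if error(scored[0], samples) < error(best, samples):
            best = scored[0]
        # Elitism: keep the better half, refill with mutants of it.
        elite = scored[:pop_size // 2]
        pop = elite + [mutate(random.choice(elite))
                       for _ in range(pop_size - len(elite))]
    return best
```

Pipeline-evolution systems replace the arithmetic operators here with preprocessing and modeling steps, and the squared-error fitness with cross-validated predictive accuracy.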
Spatially Uniform ReliefF (SURF) for computationally-efficient filtering of gene-gene interactions
Background: Genome-wide association studies are becoming the de facto standard in the genetic analysis of common human diseases. Given the complexity and robustness of biological networks, such diseases are unlikely to be the result of single points of failure but instead likely arise from the joint failure of two or more interacting components. The hope in genome-wide screens is that these points of failure can be linked to single nucleotide polymorphisms (SNPs) which confer disease susceptibility. Detecting interacting variants that lead to disease in the absence of single-gene effects is difficult, however, and methods that exhaustively analyze sets of these variants for interactions are combinatorial in nature and thus computationally infeasible. Efficient algorithms that can detect interacting SNPs are needed. ReliefF is one such promising algorithm, although it has a low success rate on noisy datasets when the interaction effect is small. ReliefF has been paired with an iterative approach, Tuned ReliefF (TuRF), which improves the estimation of weights in noisy data but does not fundamentally change the underlying ReliefF algorithm. To improve the sensitivity of studies using these methods to detect small effects, we introduce Spatially Uniform ReliefF (SURF).
Results: SURF's ability to detect interactions in this domain is significantly greater than that of ReliefF. Similarly, SURF in combination with the TuRF strategy significantly outperforms TuRF alone for SNP selection under an epistasis model. Importantly, this gain in success rate requires no increase in algorithmic complexity and is achieved even while removing a nuisance parameter from the algorithm.
Conclusion: Researchers performing genetic association studies and aiming to discover gene-gene interactions associated with increased disease susceptibility should use SURF in place of ReliefF. For instance, SURF should be used instead of ReliefF to filter a dataset before an exhaustive MDR analysis. This change increases the ability of a study to detect gene-gene interactions. The SURF algorithm is implemented in the open source Multifactor Dimensionality Reduction (MDR) software package available from http://www.epistasis.org.
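To make the contrast with ReliefF concrete, here is a minimal sketch of SURF's central idea (ours, not the MDR package's implementation; variable names are hypothetical): every pair of instances closer than the mean pairwise distance contributes to the attribute weights, so no neighbour count k needs to be chosen.

```python
# Minimal SURF sketch on integer-coded SNP data. Unlike ReliefF's
# k-nearest-neighbour scoring, SURF scores every pair of instances
# closer than the mean pairwise distance, removing the parameter k.

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def surf_weights(genotypes, labels):
    n, m = len(genotypes), len(genotypes[0])
    pairs = [(i, j) for i in range(n) for j in range(i + 1, n)]
    dists = {p: hamming(genotypes[p[0]], genotypes[p[1]]) for p in pairs}
    threshold = sum(dists.values()) / len(pairs)  # mean pairwise distance
    w = [0.0] * m
    for (i, j), d in dists.items():
        if d >= threshold:
            continue  # only "near" pairs contribute
        for a in range(m):
            diff = genotypes[i][a] != genotypes[j][a]
            if labels[i] == labels[j]:
                w[a] -= diff  # hit: a differing attribute is penalised
            else:
                w[a] += diff  # miss: a differing attribute is rewarded
    return w
```

Attributes with the highest weights are retained for downstream analysis; the sketch omits the per-instance normalisation and missing-data handling a production implementation needs.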
Accelerating epistasis analysis in human genetics with consumer graphics hardware
BACKGROUND: Human geneticists are now capable of measuring more than one million DNA sequence variations from across the human genome. The new challenge is to develop computationally feasible methods capable of analyzing these data for associations with common human disease, particularly in the context of epistasis. Epistasis describes the situation where multiple genes interact in a complex non-linear manner to determine an individual's disease risk and is thought to be ubiquitous for common diseases. Multifactor Dimensionality Reduction (MDR) is an algorithm capable of detecting epistasis. An exhaustive analysis with MDR is often computationally expensive, particularly for high order interactions. This challenge has previously been met with parallel computation and expensive hardware. The option we examine here exploits commodity hardware designed for computer graphics. In modern computers Graphics Processing Units (GPUs) have more memory bandwidth and computational capability than Central Processing Units (CPUs) and are well suited to this problem. Advances in the video game industry have led to an economy of scale creating a situation where these powerful components are readily available at very low cost. Here we implement and evaluate the performance of the MDR algorithm on GPUs. Of primary interest are the time required for an epistasis analysis and the price to performance ratio of available solutions. FINDINGS: We found that using MDR on GPUs consistently increased performance per machine over both a feature rich Java software package and a C++ cluster implementation. The performance of a GPU workstation running a GPU implementation reduces computation time by a factor of 160 compared to an 8-core workstation running the Java implementation on CPUs. This GPU workstation performs similarly to 150 cores running an optimized C++ implementation on a Beowulf cluster. 
Furthermore, this GPU system provides extremely cost-effective performance while leaving the CPU available for other tasks. The GPU workstation containing three GPUs costs $2,500. CONCLUSION: Graphics-hardware-based computing provides a cost-effective means to perform genetic analysis of epistasis using MDR on large datasets without the infrastructure of a computing cluster.
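For readers unfamiliar with MDR, the kernel being parallelised is simple and embarrassingly data-parallel, which is why it maps well onto GPUs. The following sketch (ours; the real package adds cross-validation and permutation testing) shows the two-locus case: each genotype combination is labelled high- or low-risk by its case/control ratio, collapsing two SNPs into one binary attribute that is then scored by classification accuracy.

```python
from itertools import combinations
from collections import Counter

# Two-locus MDR sketch: pool multilocus genotypes into high/low-risk
# groups by case:control ratio, then score how well that one derived
# binary attribute separates cases (label 1) from controls (label 0).

def mdr_score(genotypes, labels, pair):
    i, j = pair
    cases = Counter((g[i], g[j]) for g, y in zip(genotypes, labels) if y == 1)
    ctrls = Counter((g[i], g[j]) for g, y in zip(genotypes, labels) if y == 0)
    # A genotype combination is high-risk if cases outnumber controls.
    high_risk = {c for c in set(cases) | set(ctrls) if cases[c] > ctrls[c]}
    correct = sum(((g[i], g[j]) in high_risk) == bool(y)
                  for g, y in zip(genotypes, labels))
    return correct / len(labels)

def best_pair(genotypes, labels):
    # Exhaustive scan over all SNP pairs; this loop is what a GPU
    # implementation distributes across thousands of threads.
    m = len(genotypes[0])
    return max(combinations(range(m), 2),
               key=lambda p: mdr_score(genotypes, labels, p))
```

Because every SNP pair is scored independently, the exhaustive scan parallelises with no inter-thread communication, which is the property the GPU implementation exploits.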
Compact High-Velocity Clouds at High Resolution
Six examples of the compact, isolated high-velocity clouds catalogued by
Braun & Burton (1999) and identified with a dynamically cold ensemble of
primitive objects falling towards the barycenter of the Local Group have been
imaged with the Westerbork Synthesis Radio Telescope; an additional ten have
been imaged with the Arecibo telescope. The imaging reveals a characteristic
core/halo morphology: one or several cores of cool, relatively
high-column-density material are embedded in an extended halo of warmer,
lower-density material. Several of the cores show kinematic gradients
consistent with rotation; these CHVCs are evidently rotationally supported and
dark-matter dominated. The imaging data allow several independent estimates of
the distances to these objects, which lie in the range 0.3 to 1.0 Mpc. The CHVC
properties resemble what might be expected from very dark dwarf irregular
galaxies.
Comment: 12 pages, 7 figures, to appear in "The Chemical Evolution of the Milky Way: Stars versus Clusters", eds. F. Matteucci and F. Giovannelli, Kluwer Academic Publishers
Does the face fit the facts? Testing three accounts of age of acquisition effects
Naming and perception tasks show robust effects of age of acquisition (AoA), with faster processing of stimuli learnt earlier in life compared to stimuli acquired later. That AoA effects prove to be more elusive on semantic processing tasks is of importance in attempting to determine the mechanism and locus (or loci) of AoA effects. Three accounts of AoA effects were tested empirically using perceptual familiarity decision tasks to record response latency and accuracy to the faces and names of famous people, with the quantity of semantic knowledge being manipulated. The results do not support the semantic ‘hub’ network or arbitrary mapping explanations of AoA but are consistent with the Set-up of a Specialized Processing System hypothesis
The effects of age of acquisition and semantic congruency on famous person category verification
The age of acquisition (AoA) effect, a processing advantage for items learnt earlier in life, affects naming and making familiarity decisions about famous people. However, its influence on semantic processing tasks involving celebrity stimuli is equivocal. In a category verification task designed to explore this issue further, mature adults were shown an area of fame, followed by a famous person's name. They were asked to indicate whether the area of fame and the celebrity matched. Stimulus congruency and AoA were manipulated orthogonally, with familiarity and facial distinctiveness being controlled. Faster and more accurate responses were produced when the area of fame and the celebrity matched. Faster and more accurate responses were also made to early-acquired celebrities; the interaction between AoA and congruency fell short of significance but is consistent with that reported for lexical processing. With adequate control of extraneous variables and an extended distance between stimulus groups, AoA would seem to influence the semantic processing of famous people and to interact with congruency at near-significant levels. The results are considered in the light of multiple-loci theories of AoA.
Blueswitch: Enabling provably consistent configuration of network switches
Previous research on consistent updates for distributed network configurations has focused on solutions for centralized network-configuration controllers. However, such work does not address the complexity of modern switch datapaths. Modern commodity switches expose opaque configuration mechanisms, with minimal guarantees for datapath consistency and with unclear configuration semantics. Furthermore, would-be solutions for distributed consistent updates must take into account the configuration guarantees provided by each individual switch, plus the compositional problems of distributed control and multi-switch configurations that considerably transcend the single-switch problems. In this paper, we focus on the behavior of individual switches, and demonstrate that even simple rule updates result in inconsistent packet switching in multi-table datapaths. We demonstrate that consistent configuration updates require guarantees of strong switch-level atomicity from both hardware and software layers of switches, even in a single switch. In short, the multiple-switch problems cannot be reasonably approached until single-switch consistency can be resolved. We present a hardware design that supports a transactional configuration mechanism and provides packet-consistent configuration: all packets traversing the datapath will encounter either the old configuration or the new one, and never an inconsistent mix of the two. Unlike previous work, our design does not require modifications to network packets. We precisely specify the hardware-software protocol for switch configuration; this enables us to prove the correctness of the design, and to provide well-specified invariants that the software driver must maintain for correctness. We implement our prototype switch design using the NetFPGA-10G hardware platform, and evaluate our prototype against commercial off-the-shelf switches.
This work was jointly supported by the Defense Advanced Research Projects Agency (DARPA) and the Air Force Research Laboratory (AFRL), under contract FA8750-11-C-0249. The views, opinions, and/or findings contained in this article/presentation are those of the author/presenter and should not be interpreted as representing the official views or policies, either expressed or implied, of the Department of Defense or the U.S. Government. We also acknowledge the support of the UK EPSRC for contributing to parts of our work, through grant EP/H040536/1. Additional data related to this publication is available at the http://www.cl.cam.ac.uk/research/srg/netfpga/blueswitch/ data repository.
This is the author accepted manuscript. The final version is available from IEEE via http://dx.doi.org/10.1109/ANCS.2015.711011
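The packet-consistent property the abstract describes can be illustrated with a simplified software analogue (a sketch of the general double-buffering idea, not the Blueswitch hardware design): an update is staged in an inactive table bank and committed by a single atomic flip of the active-bank index, so every lookup observes one complete configuration.

```python
# Software analogue of packet-consistent configuration: a
# double-buffered flow table. An update never mutates the bank that
# lookups are reading; it rebuilds the inactive bank and commits with
# one atomic write to `active`, so each lookup sees either the old
# configuration or the new one in full, never a mix.

class DoubleBufferedTable:
    def __init__(self, rules):
        self.banks = [dict(rules), dict(rules)]
        self.active = 0  # index of the bank visible to the datapath

    def lookup(self, pkt):
        bank = self.banks[self.active]  # snapshot taken once per packet
        return bank.get(pkt, 'drop')

    def transactional_update(self, new_rules):
        staging = 1 - self.active
        self.banks[staging] = dict(new_rules)  # stage full new config
        self.active = staging                  # single-word commit
```

In hardware, the "single-word commit" corresponds to flipping which physical table bank the pipeline selects; the sketch omits the multi-table coordination and driver invariants that the paper specifies and proves correct.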
An exploratory randomised controlled trial of a premises-level intervention to reduce alcohol-related harm including violence in the United Kingdom
Background: To assess the feasibility of a randomised controlled trial of a licensed premises intervention to reduce severe intoxication and disorder; to establish effect sizes and identify appropriate approaches to the development and maintenance of a rigorous research design and intervention implementation.
Methods: An exploratory two-armed parallel randomised controlled trial with a nested process evaluation. The intervention comprised an audit of risk factors and a tailored action plan for high-risk premises, with a three-month follow-up audit and feedback. Thirty-two premises that had experienced at least one assault in the year prior to the intervention were recruited, match-paired and randomly allocated to the control or intervention group. Police violence data and data from a street survey of study premises' customers, including measures of breath alcohol concentration and surveyor-rated customer intoxication, were used to assess effect sizes for a future definitive trial. A nested process evaluation explored implementation barriers and the fidelity of the intervention with key stakeholders and senior staff in intervention premises using semi-structured interviews.
Results: The process evaluation indicated implementation barriers and low fidelity, with a reluctance to implement the intervention and to submit to a formal risk audit. Power calculations suggest the intervention effect on violence and subjective intoxication would be raised to significance with a study size of 517 premises.
Conclusions: It is methodologically feasible to conduct randomised controlled trials where licensed premises are the unit of allocation. However, lack of enthusiasm among senior premises staff indicates the need for intervention enforcement, rather than voluntary agreements, and ongoing strategies to promote sustainability.
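For reference, sample-size figures like the 517 premises quoted above typically derive from a standard two-proportion power calculation. The sketch below shows that textbook calculation with placeholder proportions (not the trial's actual effect sizes) and ignores the matched-pair clustering a definitive premises-level trial would additionally need to account for.

```python
from statistics import NormalDist

# Textbook sample-size calculation for comparing two proportions at
# significance level alpha with the given power. Placeholder inputs;
# a cluster-randomised trial would inflate this by a design effect.

def sample_size_per_arm(p1, p2, alpha=0.05, power=0.8):
    z = NormalDist().inv_cdf
    z_alpha = z(1 - alpha / 2)          # two-sided significance
    z_beta = z(power)                   # desired power
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return ((z_alpha + z_beta) ** 2 * variance) / (p1 - p2) ** 2
```

For example, detecting a drop from 50% to 40% at 80% power and alpha = 0.05 requires roughly 385 units per arm under these simplifying assumptions.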
Efficacy of photochemical internalisation using disulfonated chlorin and porphyrin photosensitisers: An in vitro study in 2D and 3D prostate cancer models
This study shows that the therapeutic outcome of photochemical internalisation (PCI) in prostate cancer in vitro surpasses that of photodynamic therapy (PDT) and could improve prostate PDT in the clinic, whilst avoiding chemotherapeutic side effects. In addition, the study assesses the potential of PCI with two different photosensitisers (TPCS2a and TPPS2a) in prostate cancer cells (human PC3 and rat MatLyLu) using a standard 2D monolayer culture and a 3D biomimetic model. Photosensitisers were used alone (PDT) or with the cytotoxin saporin (PCI). TPPS2a and TPCS2a were shown to be located in discrete cytoplasmic vesicles before light treatment and to redistribute into the cytosol upon light excitation. PC3 cells exhibited higher uptake than MatLyLu cells for both photosensitisers. In the 2D model, PCI resulted in greater cell death than PDT alone in both cell lines. In the 3D model, morphological changes were also observed. Saporin-based toxicity was negligible in PC3 cells but pronounced in MatLyLu cells (IC50 = 18 nM). In conclusion, the study showed that tumour features such as tumour cell growth rate or interaction with drugs determine the therapeutic conditions for optimal photochemical treatment in metastatic prostate cancer.